
feat: PromptSearchList batch workflow support & image linking v3.2.2 #139

Merged
vitosans merged 7 commits into main from feature/prompt-search-batch-linking
Apr 10, 2026
Conversation

@vitosans
Contributor

Summary

  • PromptSearchList node overhaul: fixed the empty-list crash; added partial tag matching, preview/count outputs, skip_multipart and LoRA-only filters, and newline collapsing for StringOutputList compatibility
  • Batch image linking: fixed a race condition where every batch-generated image linked to the last prompt. A new FIFO queue in the prompt tracker preserves encode→save ordering so each image links to its own prompt
  • Image monitor multi-strategy linking: a 5-tier fallback chain (queue pop → metadata lookup → snapshot → live tracker → DB fallback) handles both batch and single-image workflows

Test plan

  • PromptSearchList returns correct results for text and tag searches
  • Empty search results don't crash ComfyUI (OUTPUT_IS_LIST safe)
  • Clip_ multi-part prompts filtered correctly
  • LoRA-only prompts filtered correctly
  • Newlines collapsed — StringOutputList treats each DB entry as one prompt
  • Batch workflow: 10 different prompts → 10 images → each linked to correct prompt via queue
  • All 141 unit tests pass (21 new: 6 queue + 15 filter tests)

- Fix empty list crash (OUTPUT_IS_LIST requires at least one element)
- Add partial tag matching (LIKE instead of exact)
- Add preview and count outputs for batch visibility
- Add skip_multipart filter for Clip_N video prompts
- Add LoRA-only prompt filter
- Collapse newlines for StringOutputList compatibility
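The newline-collapsing fix described above can be sketched as a one-line normalizer. The helper name is hypothetical; the point is that each DB entry becomes a single line so StringOutputList treats it as one prompt:

```python
import re

def collapse_newlines(text: str) -> str:
    # Hypothetical helper: replace any run of newlines (plus surrounding
    # whitespace) with a single space so StringOutputList sees one entry
    # per DB prompt instead of splitting on embedded line breaks.
    return re.sub(r"\s*\n\s*", " ", text).strip()

print(collapse_newlines("a photo of a cat,\n  studio lighting\n"))
# → a photo of a cat, studio lighting
```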
Allows search_prompts to use LIKE matching for tags instead of exact.
Used by PromptSearchList for discovery-style search.
Existing callers default to exact matching (unchanged behavior).
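A minimal sketch of the exact-vs-LIKE distinction, using an in-memory SQLite table. The `prompts` table and comma-separated `tags` column here are assumptions for illustration, not the project's real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prompts (id INTEGER PRIMARY KEY, text TEXT, tags TEXT)")
conn.execute("INSERT INTO prompts (text, tags) VALUES (?, ?)",
             ("a castle at dusk", "landscape,fantasy"))

def search_by_tag(tag: str, exact: bool = True):
    if exact:
        # Wrap tags in commas so 'fantasy' can't match inside 'dark-fantasy'.
        sql = "SELECT text FROM prompts WHERE ',' || tags || ',' LIKE ?"
        param = f"%,{tag},%"
    else:
        # Discovery-style partial match: 'fant' finds 'fantasy'.
        sql = "SELECT text FROM prompts WHERE tags LIKE ?"
        param = f"%{tag}%"
    return [row[0] for row in conn.execute(sql, (param,))]

print(search_by_tag("fant", exact=False))  # partial match hits
print(search_by_tag("fant", exact=True))   # exact match misses
```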
Batch workflows encode all prompts first, then save all images later.
The queue preserves encode order so each image links to the correct
prompt by position (pop on save matches push on encode).
Previous approach relied on live prompt tracker state, which was
always stale for batch workflows (all images linked to last prompt).
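A minimal sketch of the push-on-encode / pop-on-save queue. The method names `set_current_prompt` and `pop_next_prompt` come from the review discussion below; the real class surely carries more state:

```python
import threading

class PromptTracker:
    def __init__(self):
        self._prompt_queue = []
        self._queue_lock = threading.Lock()

    def set_current_prompt(self, prompt: str) -> None:
        # Called at CLIP-encode time: push prompts in encode order.
        with self._queue_lock:
            self._prompt_queue.append(prompt)

    def pop_next_prompt(self):
        # Called at image-save time: FIFO pop matches save order to
        # encode order, so image N links to prompt N.
        with self._queue_lock:
            if self._prompt_queue:
                return self._prompt_queue.pop(0)
        return None

tracker = PromptTracker()
for p in ["prompt A", "prompt B", "prompt C"]:   # batch encodes all first
    tracker.set_current_prompt(p)
print([tracker.pop_next_prompt() for _ in range(3)])  # saves pop in order
# → ['prompt A', 'prompt B', 'prompt C']
```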

New 5-strategy priority chain:
1. FIFO queue pop (batch-correct by position)
2. Metadata-based DB lookup (extracts prompt from PNG)
3. Prompt snapshot at file-creation time
4. Live prompt tracker
5. Most recent DB prompt fallback
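The priority chain above reduces to a first-non-None scan over the five tiers. The sketch below uses stand-in lambdas for each tier rather than the real lookups:

```python
def resolve_prompt(strategies):
    # Each strategy returns a prompt (or None); the first hit wins,
    # so cheaper/more-precise tiers go earlier in the list.
    for strategy in strategies:
        prompt = strategy()
        if prompt:
            return prompt
    return None

# Tiers in the order listed above (all stand-ins here):
chain = [
    lambda: None,              # 1. FIFO queue pop (empty: single-image run)
    lambda: None,              # 2. PNG metadata DB lookup (no match)
    lambda: "snapshot prompt", # 3. snapshot at file-creation time
    lambda: "live prompt",     # 4. live tracker (not reached)
    lambda: "db prompt",       # 5. most recent DB row (not reached)
]
print(resolve_prompt(chain))   # → snapshot prompt
```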
- 6 queue tests: FIFO ordering, batch simulation, exhaustion
- 15 filter tests: newline collapse, Clip_ filter, LoRA-only filter
- Fix _make_tracker helper to include queue fields

@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 09bbc71d5e


# Strategy 1: Pop from batch queue (most reliable for batch workflows).
# Prompts are queued during CLIP encoding in order; images save in
# the same order, so FIFO pop gives the correct prompt per image.
current_prompt = self.prompt_tracker.pop_next_prompt()

P1: Prevent timer races from consuming queued prompts out of order

on_created starts a separate threading.Timer per image, so images created in sequence can enter process_new_image in a different order. Because this line always does a FIFO pop before any verification, whichever timer thread runs first consumes the next prompt context, which can swap prompt→image associations for batch outputs generated close together.


"""
with self._queue_lock:
if self._prompt_queue:
ctx = self._prompt_queue.pop(0)

P1: Discard stale queue entries before returning prompt context

This pop path returns the oldest queued prompt with no expiry or validity check. set_current_prompt appends every execution, but prompt cleanup only prunes active_prompts, not _prompt_queue, so canceled/expired executions can remain and later be linked to new images, causing incorrect associations and unbounded queue growth over long sessions.


return ([],)
return {
    "ui": {"text": ["No results found"]},
    "result": ([""], preview, count),

P2: Return zero batch items when search yields no matches

In the no-results branch, the node emits "" as one list item even though count is 0. Since prompts is an OUTPUT_IS_LIST output, downstream batch nodes will still run once with an empty prompt text instead of skipping execution, which changes the semantics of an empty search result and can trigger unintended generations.
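A hedged sketch of the reviewer's suggestion: with an OUTPUT_IS_LIST output, returning zero list items lets downstream batch nodes skip execution entirely, while `[""]` runs them once with an empty prompt. The function below is a stand-in, not the node's real return path:

```python
def search_output(results):
    # OUTPUT_IS_LIST semantics: list length = number of downstream runs.
    if not results:
        return ([],)          # zero items → downstream runs zero times
    return (results,)

print(search_output([]))          # → ([],)
print(search_output(["a", "b"]))  # → (['a', 'b'],)
```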


- Replace backtracking-prone LoRA-only regex with subtraction approach
  (re.sub removes tags, check if anything remains). Fixes CodeQL alert.
- Add expiry check to pop_next_prompt — skips stale entries to prevent
  incorrect linking from accumulated queue entries across sessions.
- Cleanup thread now also prunes expired queue entries.
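The subtraction approach from the first bullet can be sketched as follows. The `<lora:name:weight>` tag pattern is an assumption about the prompt syntax being filtered:

```python
import re

LORA_TAG = re.compile(r"<lora:[^>]+>")

def is_lora_only(prompt: str) -> bool:
    # Subtraction approach: strip every LoRA tag, then check whether any
    # non-whitespace text remains. Linear-time, unlike a backtracking
    # "whole string is tags" regex.
    remainder = LORA_TAG.sub("", prompt)
    return not remainder.strip()

print(is_lora_only("<lora:styleA:0.8> <lora:styleB:0.6>"))  # → True
print(is_lora_only("<lora:styleA:0.8> a cat in the rain"))  # → False
```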
@vitosans vitosans merged commit 8146250 into main Apr 10, 2026
11 checks passed
