Extract prompt building logic into build_all_requests() function#88

Open
qianchongyang wants to merge 1 commit into LLMSQL:main from qianchongyang:bounty/20260319-llmsql-llmsql-benchmark-62

Conversation

@qianchongyang

Problem

Currently, prompt creation logic is duplicated across multiple inference functions. This leads to code redundancy and makes it difficult to apply consistent chat templates or modify prompt formats.

Solution

Created a new build_all_requests() function that centralizes prompt generation. The function handles:

  • Building all prompts from input data
  • Applying chat templates when specified

All inference functions now call this shared function instead of duplicating the logic.
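As a rough sketch of what a centralized helper like this might look like (the function body, prompt format, and parameter names below are illustrative assumptions, not the PR's actual code):

```python
def build_all_requests(questions, tables, tokenizer=None):
    """Build one prompt per (question, table) pair.

    Hypothetical sketch: the real utils.py implementation may differ.
    If a tokenizer is given, its chat template is applied so that every
    inference path (vLLM, API) formats prompts identically.
    """
    prompts = []
    for question, table in zip(questions, tables):
        # Plain-text prompt format is an assumption for illustration.
        prompt = f"Table:\n{table}\n\nQuestion: {question}\nSQL:"
        if tokenizer is not None:
            # Wrap the prompt in a single user turn and render it with the
            # model's chat template (Hugging Face tokenizers expose this).
            prompt = tokenizer.apply_chat_template(
                [{"role": "user", "content": prompt}],
                tokenize=False,
                add_generation_prompt=True,
            )
        prompts.append(prompt)
    return prompts
```

Callers such as inference_vllm.py and inference_api.py would then build their request lists with one call instead of each re-implementing the prompt loop.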

Validation

  • Existing tests pass without modification
  • New function is imported and used by all inference functions
  • Chat template application is consistent across all inference paths

Fixes #62

Created a shared build_all_requests() function in utils.py that handles:
- Building prompts from questions and tables
- Optional chat template application via tokenizer

Updated inference_vllm.py and inference_api.py to use the new function,
removing duplicate prompt building logic.

Fixes LLMSQL#62
@DzmitryPihulski
Collaborator

Hi @qianchongyang , thank you for your contribution!

Your work will be merged after review.

@codecov

codecov bot commented Mar 21, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.



Development

Successfully merging this pull request may close these issues.

Move creation of prompts to a function
