
LLMSQL CLI Manual

The llmsql CLI provides two main workflows:

  • inference: generate SQL predictions with a selected backend.
  • evaluate: score predictions on the LLMSQL benchmark.

Command Structure

llmsql <command> [options]

Available commands:

  • llmsql inference transformers ...
  • llmsql inference vllm ...
  • llmsql inference api ...
  • llmsql evaluate ...

Inference Commands

1) Transformers backend

llmsql inference transformers \
  --model-or-model-name-or-path Qwen/Qwen2.5-1.5B-Instruct \
  --output-file outputs/preds_transformers.jsonl

This command calls inference_transformers().
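Whether the CLI creates a missing output directory itself is not documented here, so as a precaution you can create the directory named in --output-file before running (a hedged sketch, not a documented requirement):

```shell
# Create the output directory up front in case the CLI does not
# create it automatically (precaution; check the CLI's own behavior).
mkdir -p outputs
```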

2) vLLM backend

llmsql inference vllm \
  --model-name Qwen/Qwen2.5-1.5B-Instruct \
  --output-file outputs/preds_vllm.jsonl

This command calls inference_vllm().

3) OpenAI-compatible API backend

llmsql inference api \
  --model-name gpt-5-mini \
  --base-url https://api.openai.com/v1 \
  --output-file outputs/preds_api.jsonl

This command calls inference_api().
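Most OpenAI-compatible clients read credentials from the OPENAI_API_KEY environment variable. Whether llmsql does the same is an assumption here; check `llmsql inference api --help` for the exact mechanism before relying on it:

```shell
# Assumption: the api backend picks up credentials from the standard
# OPENAI_API_KEY environment variable used by OpenAI-compatible clients.
export OPENAI_API_KEY="sk-your-key-here"   # placeholder value, not a real key
```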

Evaluation Command

llmsql evaluate --outputs outputs/preds_transformers.jsonl

This command calls evaluate().
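Evaluation reads the predictions file produced by an inference run. The .jsonl extension suggests one JSON object per line, so standard tools can sanity-check the file before scoring; the exact fields inside each object are backend-specific and not documented in this manual:

```shell
# Quick sanity check of a predictions file before evaluating.
# Assumes JSONL layout (one JSON object per line), inferred from
# the .jsonl extension; field names are not specified here.
head -n 2 outputs/preds_transformers.jsonl   # peek at the first records
wc -l < outputs/preds_transformers.jsonl     # number of predictions
```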

Help

Use the built-in help to list all options for each command:

llmsql --help
llmsql inference --help
llmsql inference transformers --help
llmsql inference vllm --help
llmsql inference api --help
llmsql evaluate --help