The llmsql CLI provides two main workflows:
- `inference`: generate SQL predictions with a selected backend.
- `evaluate`: score predictions on the LLMSQL benchmark.

General usage:

```bash
llmsql <command> [options]
```

Available commands:

```bash
llmsql inference transformers ...
llmsql inference vllm ...
llmsql inference api ...
llmsql evaluate ...
```
```bash
llmsql inference transformers \
  --model-or-model-name-or-path Qwen/Qwen2.5-1.5B-Instruct \
  --output-file outputs/preds_transformers.jsonl
```

This command calls `inference_transformers()`.
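Each backend writes its predictions to the JSONL file given by `--output-file`: one JSON object per line. The exact field names in each record are not shown in this document, so the `question_id` and `predicted_sql` keys below are hypothetical; this is only a minimal sketch of reading and writing a JSONL predictions file.

```python
import json

# Hypothetical record layout -- the real field names depend on the
# LLMSQL output schema, which is not specified here.
records = [
    {"question_id": 0, "predicted_sql": "SELECT name FROM users WHERE age > 21;"},
    {"question_id": 1, "predicted_sql": "SELECT COUNT(*) FROM orders;"},
]

# Write one JSON object per line (the JSONL format used by --output-file).
with open("preds_demo.jsonl", "w", encoding="utf-8") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

# Read the file back the same way a downstream consumer would.
with open("preds_demo.jsonl", encoding="utf-8") as f:
    loaded = [json.loads(line) for line in f]

print(len(loaded))  # → 2
```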
```bash
llmsql inference vllm \
  --model-name Qwen/Qwen2.5-1.5B-Instruct \
  --output-file outputs/preds_vllm.jsonl
```

This command calls `inference_vllm()`.
```bash
llmsql inference api \
  --model-name gpt-5-mini \
  --base-url https://api.openai.com/v1 \
  --output-file outputs/preds_api.jsonl
```

This command calls `inference_api()`.
```bash
llmsql evaluate --outputs outputs/preds_transformers.jsonl
```

This command calls `evaluate()`.
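The benchmark's actual scoring metric is defined by `evaluate()` and is not described in this document; as a rough illustration of the idea, the sketch below computes exact-match accuracy between hypothetical gold SQL and predicted SQL, using the same made-up `question_id`/`predicted_sql` keys as above.

```python
import json  # predictions would normally be loaded from a JSONL file

# Toy gold answers and predictions; exact string match is only an
# illustration -- the real LLMSQL metric may differ.
gold = {0: "SELECT name FROM users;", 1: "SELECT COUNT(*) FROM orders;"}
preds = [
    {"question_id": 0, "predicted_sql": "SELECT name FROM users;"},
    {"question_id": 1, "predicted_sql": "SELECT * FROM orders;"},
]

def exact_match_accuracy(preds, gold):
    """Fraction of predictions whose SQL string equals the gold SQL."""
    hits = sum(
        p["predicted_sql"].strip() == gold[p["question_id"]].strip()
        for p in preds
    )
    return hits / len(preds)

print(exact_match_accuracy(preds, gold))  # → 0.5
```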
Use built-in help to see all options:
```bash
llmsql --help
llmsql inference --help
llmsql inference transformers --help
llmsql inference vllm --help
llmsql inference api --help
llmsql evaluate --help
```