Add vLLM and LM Studio to LLM providers in backend #564

Description

@sal-uva

Currently, only Ollama is supported, but since vLLM and LM Studio both expose OpenAI-compatible APIs, they could easily be added, and their available models could be loaded via /v1/models. See llm_adapter.py.
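Because both servers follow the OpenAI API shape, the model list could be fetched the same way for each provider. A minimal sketch of what that might look like; the function name, base URLs, and default ports (8000 for vLLM, 1234 for LM Studio) are assumptions here, not code from llm_adapter.py:

```python
import requests


def list_openai_compatible_models(base_url: str, api_key: str = "") -> list[str]:
    """Fetch model IDs from an OpenAI-compatible /v1/models endpoint."""
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    response = requests.get(f"{base_url.rstrip('/')}/v1/models", headers=headers, timeout=10)
    response.raise_for_status()
    # OpenAI-style responses wrap the model objects in a "data" field
    return [model["id"] for model in response.json().get("data", [])]


# Assumed default local endpoints for each provider
providers = {"vllm": "http://localhost:8000", "lmstudio": "http://localhost:1234"}
for name, url in providers.items():
    try:
        print(name, list_openai_compatible_models(url))
    except requests.RequestException as e:
        print(f"{name}: could not reach endpoint ({e})")
```

Since the response format is the same for Ollama's OpenAI-compatible mode, vLLM, and LM Studio, a single helper like this could serve all three providers, with only the base URL differing per configuration.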

Metadata

Labels

(mostly) back-end: Primarily involves the back-end (e.g. post-processors, scheduler or crawlers).
enhancement: New feature or request.
