The ChatBot Toolkit is designed for rapid development of LLM-powered chatbot applications, from simple chat interfaces to advanced RAG systems and agentic workflows using tools.
This toolkit provides plug-and-play modules that streamline chatbot and LLM application development, making the process faster, more efficient, and scalable. By using pre-built, reusable components, we:
- ✅ Reduce development time ⏳
- ✅ Improve code quality 🔥
- ✅ Minimize errors from rushed coding 💡
This repository is a modular toolkit with multiple standalone modules to help you build LLM-based systems, including:
- `/chatbot_toolkit/chatbot` – Basic to advanced chatbot frameworks 🤖
- `/chatbot_toolkit/rag` – Retrieval-Augmented Generation pipelines 📚
- `/chatbot_toolkit/tools` – Agentic workflows with tool integration 🛠️
Each module is self-contained, extensible, and designed to be used individually or combined together.
To ensure smooth operation, the project requires certain environment variables. These should be stored in a .env file at the root of the project.
Create a .env file in the project root and add the following:
```bash
# 🔹 OpenRouter API Configuration
OPENAI_API_KEY="your_openrouter_api_key_here"
OPENAI_API_BASE="https://openrouter.ai/api/v1"

# 🔹 LangSmith Tracing (for debugging & monitoring model behavior)
LANGSMITH_API_KEY="your_langsmith_api_key_here"
LANGSMITH_TRACING="true"
LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
LANGSMITH_PROJECT="your_project_name_here"
```

- `OPENAI_API_KEY` – API key for authentication with OpenRouter.
- `OPENAI_API_BASE` – Points requests at OpenRouter's OpenAI-compatible endpoint (LangChain has no native OpenRouter integration, so it is accessed through the OpenAI client).
- `LANGSMITH_API_KEY` – API key for LangSmith, used to enable tracing.
- `LANGSMITH_TRACING` – Enables/disables LangSmith tracing (`true` = enabled).
- `LANGSMITH_ENDPOINT` – Endpoint for sending trace data.
- `LANGSMITH_PROJECT` – Your project name in LangSmith.
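In application code, these variables are typically loaded with a library such as `python-dotenv`. As a stdlib-only sketch of what that loading amounts to (the parsing here is deliberately minimal and handles only the simple `KEY="value"` lines shown above):

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: reads KEY="value" lines, skips blanks and # comments."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault: real environment variables take precedence over the file
            os.environ.setdefault(key.strip(), value.strip().strip('"'))

if os.path.exists(".env"):
    load_env()
```

For anything beyond this (multiline values, export prefixes, variable expansion), prefer `python-dotenv`'s `load_dotenv()` over rolling your own parser.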
The toolkit includes a Dockerized development environment to simplify setup.
After launching the container, you can manually run any module you'd like to work on.
- Build the Docker image:

```bash
docker build -t chatbot .
```

- Start the container using Docker Compose:

```bash
docker-compose up
```

⚠️ Make sure your `.env` file is in the root directory before starting.
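The `docker-compose up` command assumes a `docker-compose.yml` in the project root. A minimal sketch of what such a file might look like (the service name and options here are assumptions, not taken from the repository):

```yaml
services:
  chatbot:
    build: .
    image: chatbot
    env_file:
      - .env        # injects the variables defined above
    tty: true       # keep the container alive so you can `docker exec` into it
    stdin_open: true
```

The `env_file` option is what makes the `.env` placement matter: Compose resolves it relative to the compose file's directory.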
Once the container is up and running:
```bash
# Access the container
docker exec -it <container_name_or_id> /bin/bash

# Run a specific module using its test script
python3 /chatbot_toolkit/chatbot/test.py   # 🔹 ChatBot module
python3 /chatbot_toolkit/rag/test.py       # 🔹 RAG module
python3 /chatbot_toolkit/tools/test.py     # 🔹 Agentic tools module
```

Each module is designed to be:
- Plug-and-play
- Easy to extend
- Useful for real-world LLM apps
- ✨ Rapid prototyping of LLM-powered apps
- 🧩 Modular architecture for flexibility
- 🐳 Dev-containerized for easy onboarding
- 📊 Production-ready with LangSmith tracing support
- 🔐 Secure & clean separation of environment variables
Want to add a new module or improve an existing one? PRs are welcome! Please follow the modular design style and keep `test.py` as the entry point for new components.
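As a sketch of that convention, a new module's `test.py` might look like the following (the module name and function are illustrative placeholders, not code from this repository):

```python
# chatbot_toolkit/my_module/test.py
# Entry point for the module: runnable standalone inside the dev container
# via `python3 /chatbot_toolkit/my_module/test.py`.

def run_demo(prompt: str) -> str:
    """Placeholder for the module's core logic (e.g. an LLM call)."""
    return f"echo: {prompt}"

if __name__ == "__main__":
    print(run_demo("hello from my_module"))
```

Keeping the demo logic behind a function (rather than at module top level) lets other modules import and reuse it without triggering the demo run.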