Commit b35b168 ("Updated V 1.3.0 Docs")
1 parent 49ffc49 commit b35b168
2 files changed: 230 additions & 27 deletions

File: README.md (73 additions & 27 deletions)
@@ -15,6 +15,9 @@ Designed for developers, researchers, and power users, DeepShell abstracts away
 * **Multi-LLM Support:**
     * Seamlessly connect to **Ollama** servers (local or remote).
     * Integrate with the **Google Gemini API**.
+    * Connect to **OpenRouter.ai** for access to 200+ LLM models from various providers.
+    * Advanced model selection with pagination and sorting (free models first).
+    * Multi-key management with nicknames for different OpenRouter accounts.
 * **Conversational Memory & Customization:**
     * Engage in multi-turn conversations using the **interactive mode** (`-i`).
     * Set the conversation history limit (defaults to 25 turns).
@@ -28,11 +31,18 @@ Designed for developers, researchers, and power users, DeepShell abstracts away
     * Easily switch between configured LLM services (`-l`).
     * Quickly jump back to the previously used LLM service (`-j`).
     * List available models from your connected LLM service and change the default model per service (`-m`).
-* **Advanced Gemini API Key Management:**
-    * Store and manage multiple Gemini API keys with user-defined nicknames.
-    * Easily add new keys or set an active key from your stored list (`-set-key`).
-    * Display the currently active Gemini API key's nickname and value (`-show-key`).
-    * Quickly check your Gemini API key status and get a link to your usage dashboard (`-gq`).
+* **Advanced API Key Management:**
+    * Store and manage multiple API keys with user-defined nicknames for both Gemini and OpenRouter.
+    * Unified key management interface (`-set-key`) supporting both services.
+    * Display the currently active API key for any LLM service (`-show-key`).
+    * Quick active configuration summary showing current LLM, model, and API key (`-a`).
+    * Gemini-specific quota checking and usage dashboard access (`-gq`).
+* **Configuration Backup & Migration:**
+    * Export complete configuration to encrypted files (`-b`) with password protection.
+    * Import configuration from backup files (`-c`) with confirmation prompts.
+    * Files saved to Downloads folder in secure binary format (unreadable as text).
+    * Future-proof design with version metadata for cross-version compatibility.
+    * Perfect for backing up settings, sharing configurations, or migrating between systems.
 * **Intuitive User Experience:**
     * Send queries directly from your command line (`-q`).
     * Beautiful Markdown rendering for LLM responses in the terminal with native C implementation.
@@ -124,23 +134,40 @@ Your settings will be saved to `~/.deepshell/deepshell.conf`.
 ./deepshell -m (or --model-change)
 ```
 
-### Gemini-Specific Commands
+### API Key Management
 
-**Interactively manage Gemini API keys** (add, remove, set active)
+**Interactively manage API keys for LLM services** (Gemini or OpenRouter)
 ```bash
 ./deepshell -set-key (or --set-api-key)
 ```
 
-**Show the active Gemini API key nickname and value**
+**Show the active API key for current LLM service**
 ```bash
 ./deepshell -show-key (or --show-api-key)
 ```
 
+**Quick summary of active LLM, model, and API key**
+```bash
+./deepshell -a (or --active-config)
+```
+
 **Check Gemini API key status and get quota info**
 ```bash
 ./deepshell -gq (or --gemini-quota)
 ```
 
+### Configuration Backup & Migration
+
+**Export configuration to encrypted backup file**
+```bash
+./deepshell -b mybackup.config (or --export mybackup.config)
+```
+
+**Import configuration from encrypted backup file**
+```bash
+./deepshell -c mybackup.config (or --import mybackup.config)
+```
+
 ### Configuration & Info
 
 **Display the currently active configuration details**
@@ -175,25 +202,39 @@ DeepShell stores its configuration in a JSON file located at `~/.deepshell/deeps
 An example configuration might look like this:
 ```json
 {
-  "active_llm_service": "gemini",
-  "previous_active_llm_service": "ollama",
-  "llm_services": {
-    "ollama": {
-      "server_address": "http://localhost:11434",
-      "model": "llama3:latest",
-      "render_markdown": true
-    },
-    "gemini": {
-      "api_keys": [
-        {
-          "nickname": "personal-key",
-          "key": "BIsa8y..."
-        }
-      ],
-      "active_api_key_nickname": "personal-key",
-      "model": "models/gemini-1.5-flash",
-      "render_markdown": true
-    }
+  "active_llm_service": "openrouter",
+  "previous_active_llm_service": "gemini",
+  "interactive_history_limit": 25,
+  "enable_streaming": false,
+  "show_progress_animation": true,
+  "ollama": {
+    "server_address": "http://localhost:11434",
+    "model": "llama3:latest",
+    "render_markdown": true
+  },
+  "gemini": {
+    "api_keys": [
+      {
+        "nickname": "personal-key",
+        "key": "AIza..."
+      }
+    ],
+    "active_api_key_nickname": "personal-key",
+    "model": "models/gemini-1.5-flash",
+    "render_markdown": true
+  },
+  "openrouter": {
+    "api_keys": [
+      {
+        "nickname": "work-account",
+        "key": "sk-or-v1-..."
+      }
+    ],
+    "active_api_key_nickname": "work-account",
+    "model": "openai/gpt-4o",
+    "site_url": "https://myproject.com",
+    "site_name": "My Project",
+    "render_markdown": true
   }
 }
 ```
@@ -212,6 +253,11 @@ The C version offers several advantages over interpreted languages:
 ----
 * **Ollama:** Connect to any Ollama instance serving models like Llama, Mistral, etc.
 * **Google Gemini:** Access Gemini models (e.g., `gemini-1.5-pro`, `gemini-1.5-flash`) via the Google AI Studio API.
+* **OpenRouter.ai:** Access 200+ models from providers like OpenAI, Anthropic, Meta, Google, and more:
+    * GPT-4, GPT-3.5, Claude, Llama, Mixtral, Gemma, and many others
+    * Free and paid models with transparent pricing
+    * Advanced model browser with pagination and sorting (free models first)
+    * Multi-account support with API key nicknames
 
 ---

File: RELEASE_NOTES_1.3.0.md (new file, 157 additions & 0 deletions)
# 🚀 DeepShell v1.3.0 - Major Feature Release

## 🎯 Overview

We're excited to announce **DeepShell v1.3.0**, a major feature release that significantly expands DeepShell's capabilities with **OpenRouter.ai integration**, **secure configuration backup/migration**, and numerous enhancements that make DeepShell one of the most comprehensive LLM command-line interfaces available.

## ✨ New Features

### 🌐 OpenRouter.ai Integration
**A complete third LLM service with full feature parity**

- **200+ Model Access**: Connect to OpenAI, Anthropic, Meta, Google, Mistral, and many more providers through a single interface
- **Advanced Model Browser**:
  - Paginated model selection (15 models per page)
  - Smart sorting: **free models listed first**, then alphabetical
  - Navigate with `n` (next), `p` (previous), `q` (quit)
- **Multi-Key Management**: Store multiple OpenRouter API keys with custom nicknames
- **Full Feature Support**: All DeepShell features work seamlessly with OpenRouter
- **Free Model Support**: Easy access to free models like `openai/gpt-4o-mini:free`
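The ordering rule described above (free models first, then alphabetical, 15 per page) can be sketched with a standard `qsort` comparator. This is an illustrative sketch only: the type and function names below are assumptions, not DeepShell's actual source.

```c
#include <stdlib.h>
#include <string.h>

#define MODELS_PER_PAGE 15  /* page size stated in the release notes */

typedef struct {
    const char *id;   /* e.g. "openai/gpt-4o-mini:free" */
    int is_free;      /* 1 if the model is free-tier */
} Model;

/* qsort comparator: free models sort before paid ones; ties are
 * broken alphabetically by model id. */
static int model_cmp(const void *a, const void *b)
{
    const Model *ma = (const Model *)a;
    const Model *mb = (const Model *)b;
    if (ma->is_free != mb->is_free)
        return mb->is_free - ma->is_free;   /* free (1) sorts first */
    return strcmp(ma->id, mb->id);
}

static void sort_models(Model *models, size_t n)
{
    qsort(models, n, sizeof models[0], model_cmp);
}

/* Number of browser pages needed for n models (ceiling division). */
static int page_count(int n)
{
    return (n + MODELS_PER_PAGE - 1) / MODELS_PER_PAGE;
}
```

A comparator like this keeps the browse order stable regardless of the order the OpenRouter API returns models in.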
### 🔐 Configuration Backup & Migration
**Enterprise-grade configuration management**

- **Encrypted Export** (`-b filename.config`):
  - Password-protected with confirmation
  - Secure binary format (completely unreadable as text)
  - Saves to the Downloads folder automatically
  - Includes ALL settings, API keys, and configurations
- **Secure Import** (`-c filename.config`):
  - Password verification
  - Confirmation prompt before overwriting
  - Future-proof with version metadata
- **Perfect for**:
  - Backing up your complete DeepShell setup
  - Migrating between development machines
  - Sharing team configurations securely
  - Disaster recovery
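The release notes do not specify the cipher or file layout DeepShell uses, so the sketch below only illustrates the general shape of a password-protected binary export: a magic/version header (the "version metadata") followed by a payload obfuscated with a password-derived keystream. Every name here is hypothetical, and a real implementation should use a vetted AEAD cipher (e.g. AES-GCM or ChaCha20-Poly1305), not this toy XOR keystream.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

#define BACKUP_MAGIC   "DSBK"   /* hypothetical 4-byte file magic */
#define BACKUP_VERSION 1        /* hypothetical format version byte */

/* FNV-1a hash, used here only to expand the password into keystream bytes. */
static uint32_t fnv1a(const char *s, uint32_t seed)
{
    uint32_t h = 2166136261u ^ seed;
    while (*s) { h ^= (uint8_t)*s++; h *= 16777619u; }
    return h;
}

/* Symmetric transform: XOR the buffer with a password-derived keystream.
 * Applying it twice with the same password restores the original bytes. */
static void xor_keystream(uint8_t *buf, size_t len, const char *password)
{
    for (size_t i = 0; i < len; i++)
        buf[i] ^= (uint8_t)fnv1a(password, (uint32_t)i);
}

/* Write header + obfuscated config JSON to an open binary stream. */
static int write_backup(FILE *out, const char *json, const char *password)
{
    size_t len = strlen(json);
    uint8_t tmp[4096];
    if (len > sizeof tmp) return -1;
    memcpy(tmp, json, len);
    xor_keystream(tmp, len, password);
    if (fwrite(BACKUP_MAGIC, 1, 4, out) != 4) return -1;
    if (fputc(BACKUP_VERSION, out) == EOF) return -1;
    if (fwrite(tmp, 1, len, out) != len) return -1;
    return 0;
}
```

The leading version byte is what makes a format "future-proof": an importer can branch on it before attempting to decode the payload.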
### 🔑 Enhanced API Key Management
**Unified and powerful key management across all services**

- **Service-Agnostic Commands**:
  - `-set-key`: Now manages both Gemini AND OpenRouter keys
  - `-show-key`: Shows the active key for the current LLM service
  - `-a` / `--active-config`: Quick summary of active LLM, model, and API key
- **Consistent UX**: OpenRouter key management now matches Gemini's interface exactly
- **Smart Validation**: Both services support nickname-based multi-key workflows
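The nickname-based multi-key workflow described above can be modeled minimally as a list of (nickname, key) pairs with a lookup helper, mirroring the `api_keys` / `active_api_key_nickname` shape in the example configuration. This is an illustrative sketch; all names are assumed rather than taken from DeepShell's source.

```c
#include <stddef.h>
#include <string.h>

typedef struct {
    const char *nickname;  /* user-chosen label, e.g. "work-account" */
    const char *key;       /* the API key itself */
} ApiKey;

/* Return the stored key whose nickname matches, or NULL if absent.
 * Resolving the active key is then a lookup with the service's
 * active_api_key_nickname value. */
static const ApiKey *find_key(const ApiKey *keys, size_t n, const char *nickname)
{
    for (size_t i = 0; i < n; i++)
        if (strcmp(keys[i].nickname, nickname) == 0)
            return &keys[i];
    return NULL;
}
```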
## 🛠️ Improvements & Fixes

### User Experience Enhancements
- **Alphabetized Help Menu**: All command-line options now sorted alphabetically for easier reference
- **Updated Interactive Logo**: Now properly shows "Multi-LLM Support (Ollama, Gemini, and OpenRouter)"
- **Improved Error Handling**: Better error messages and validation throughout
- **Enhanced Model Selection**: Improved pagination and user-friendly navigation

### Technical Improvements
- **Auto-Setup Bypass**: Export/import flags now correctly bypass automatic setup
- **Future-Proof Design**: Export format includes version metadata for cross-version compatibility
- **Binary Security**: Export files are true binary format, preventing accidental text viewing
- **Memory Management**: Enhanced memory handling and cleanup throughout

### Bug Fixes
- Fixed the import flag being overridden by default configuration setup
- Resolved model selection issues for OpenRouter
- Corrected API key management inconsistencies
- Fixed various edge cases in configuration handling

## 📊 What's Included

### Complete Configuration Export
When you export your configuration, **everything** is included:
- All LLM service configurations (Ollama, Gemini, OpenRouter)
- All API keys with their nicknames
- Model selections for each service
- Interactive settings (history limit, streaming, animation)
- Markdown rendering preferences
- Server addresses and site attribution

### OpenRouter Model Categories
Access to major model families including:
- **OpenAI**: GPT-4, GPT-3.5 (including free variants)
- **Anthropic**: Claude 3.5 Sonnet, Claude 3 Haiku
- **Meta**: Llama 3.1, Llama 3.2 (including free versions)
- **Google**: Gemma 2, PaLM models
- **Mistral**: Mixtral, Mistral 7B
- **And 190+ more models** from various providers

## 🚀 Getting Started with New Features

### Try OpenRouter
```bash
# Set up OpenRouter
./deepshell -s
# Select option 1 (Manage LLM Services)
# Choose 3 (OpenRouter)

# Quick model change with new pagination
./deepshell -m
# Browse models with n/p navigation, free models listed first

# Manage multiple OpenRouter keys
./deepshell -set-key
# Choose 2 (OpenRouter)
```

### Backup Your Configuration
```bash
# Export your complete setup
./deepshell -b my-deepshell-backup.config
# Enter password twice for protection

# Import on another machine
./deepshell -c my-deepshell-backup.config
# Enter password and confirm overwrite
```

### Quick Status Check
```bash
# See your current setup at a glance
./deepshell -a
# Shows: LLM Service, Model, API Key (with nickname)
```

## 🔧 Migration Guide

### From v1.2.x
No breaking changes! Your existing configuration will work seamlessly; the new features are additive.

### Recommended Actions
1. **Backup First**: `./deepshell -b v1-2-backup.config`
2. **Try OpenRouter**: Add it as a third LLM option
3. **Test New Commands**: Explore `-a` for quick status checks

## 🎯 What's Next

DeepShell v1.3.0 establishes a solid foundation for multi-LLM management with enterprise-grade backup capabilities. Future releases will focus on:
- Additional LLM service integrations
- Advanced conversation management features
- Enhanced interactive mode capabilities
- Performance optimizations

## 📚 Documentation

Full documentation with examples and tutorials is available in our [README.md](README.md).

## 🙏 Acknowledgments

Special thanks to the community for feature requests, bug reports, and testing that made this release possible.

---

**Download DeepShell v1.3.0** from the [releases page](https://github.com/ashes00/deepshell/releases) or build from source.

**Need Help?** Check our [README.md](README.md) or open an issue on GitHub.

Happy Querying! 🤖✨
