- README: mark all features as working, update security section
- README: document API key auth, nginx proxy, encryption
- CHANGELOG: add security hardening entries
README.md: 77 additions, 16 deletions
@@ -2,7 +2,7 @@
This project deploys a **Telegram chatbot** on AWS using Infrastructure as Code (Terraform). It provisions **S3 buckets** for storing archived chat history, **DynamoDB tables** for managing user sessions, **Lambda** for serverless processing, and **API Gateway** for real-time webhook integration with Telegram.
- > **⚠️ Status**: Currently in development. Ollama AI integration (backend models) is not yet implemented. Session management, commands, archive features, and Telegram connectivity are fully functional.
+ > **Status**: Fully functional. Ollama AI integration is live on EC2 with API key authentication. Session management, commands, archive features, and AI chat are all working.
---
@@ -19,6 +19,7 @@ This project deploys a **Telegram chatbot** on AWS using Infrastructure as Code
  * [Bot Commands](#bot-commands)
  * [Project Structure](#project-structure)
  * [Module Structure](#module-structure)
+ * [External API Integration](#external-api-integration)
  * [Data Storage](#data-storage)
  * [Observability](#observability)
  * [Verification](#verification)
@@ -39,24 +40,24 @@ This project creates a serverless Telegram bot running on AWS. When users send m
  - ✅ Archive system (`/archive`, `/listarchives`, `/export`, file import)
  - ✅ DynamoDB for live session storage
  - ✅ S3 for archived session storage
- - ⏳ Ollama AI integration (planned for next phase)
+ - ✅ Ollama AI integration with API key authentication
Creates an EC2 instance running Ollama for AI inference.
+
+ | Variable | Description | Default |
+ |----------|-------------|---------|
+ | `instance_name` | Name tag for the instance | Required |
+ | `instance_type` | EC2 instance type | `t3.large` |
+ | `ollama_model` | Model to pull on first boot | `tinyllama` |
+ | `models_s3_bucket` | S3 bucket for model persistence | Required |
+ | `ssh_allowed_cidr` | CIDR for SSH access | `0.0.0.0/0` |
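A minimal sketch of invoking this module with the variables above — the module source path and bucket name are hypothetical, not taken from the repository:

```hcl
# Hypothetical module invocation; source path and bucket name are illustrative.
module "ollama" {
  source = "./modules/ollama-ec2"

  instance_name    = "telegram-bot-ollama"
  instance_type    = "t3.large"       # matches the documented default
  ollama_model     = "tinyllama"      # pulled on first boot
  models_s3_bucket = "my-ollama-models-bucket"
  ssh_allowed_cidr = "203.0.113.0/24" # tighten from the 0.0.0.0/0 default
}
```

Restricting `ssh_allowed_cidr` to a known address range is worth doing in any real deployment; the `0.0.0.0/0` default leaves SSH open to the internet.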
+
+ ---
+
+ ## External API Integration
+
+ ### Ollama (Self-Hosted LLM Inference)
+
+ The bot integrates with [Ollama](https://ollama.com), a self-hosted large language model inference server running on an EC2 instance. When users send chat messages, Lambda calls the Ollama API over HTTP to generate AI responses.
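From the Lambda side, such a call can be sketched in Python using only the standard library. The proxy URL and the `X-Api-Key` header name here are placeholders for whatever the nginx proxy actually enforces; Ollama's own `/api/generate` endpoint accepts a JSON body with `model`, `prompt`, and `stream` fields:

```python
import json
import urllib.request

# Hypothetical values -- substitute the nginx proxy URL and the API key
# provisioned for the bot (e.g. from an environment variable or SSM).
OLLAMA_URL = "http://ollama.example.internal/api/generate"
API_KEY = "replace-with-your-key"


def build_request(prompt: str, model: str = "tinyllama") -> urllib.request.Request:
    """Build an authenticated POST to Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json", "X-Api-Key": API_KEY},
    )


def ask_ollama(prompt: str) -> str:
    """Send the prompt and return the generated text from the JSON response."""
    with urllib.request.urlopen(build_request(prompt), timeout=60) as resp:
        return json.loads(resp.read())["response"]
```

Setting `"stream": False` makes Ollama return a single JSON object instead of a stream of chunks, which keeps the Lambda handler simple.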