Telegram LLM Bot #87

Merged
skobkin merged 3 commits from feature_telegram_llm_bot into master 2024-03-10 03:06:53 +00:00
3 changed files with 25 additions and 0 deletions
Showing only changes of commit 957333cf5d

View file

@@ -77,6 +77,7 @@ Not every stack is tested to fully work.
| Speedtest | ✅ | `adolfintel/speedtest` | Libre speed test implementation. | [Website](https://librespeed.org), [Github](https://github.com/librespeed/speedtest) |
| Synapse | ✅ | `matrixdotorg/synapse` | Matrix reference server written in Python. | [Website](https://matrix.org/docs/projects/server/synapse), [Github](https://github.com/matrix-org/synapse), [Installation and configuration](https://matrix-org.github.io/synapse/latest/setup/installation.html) |
| Syncthing | ✅ | `linuxserver/syncthing` | P2P file synchronization daemon. | [Website](https://syncthing.net), [Github](https://github.com/syncthing/syncthing) |
| Telegram LLM Bot | ✅ | `skobkin/telegram-llm-bot` | Simple Telegram bot to interact with an LLM running in Ollama. | [Forgejo repository](https://git.skobk.in/skobkin/telegram-ollama-reply-bot) |
| Telegram RSS Bot | ✅ | `miroslavsckaya/tg-rss-bot` | Telegram RSS Bot by @Miroslavsckaya. | [Gitea](https://git.skobk.in/Miroslavsckaya/tg_rss_bot/), [Github Mirror](https://github.com/Miroslavsckaya/tg_rss_bot) |
| Tor OBFS4 Bridge | ✅ | `thetorproject/obfs4-bridge` | Tor OBFS4 Bridge for Tor blocking bypass. | [Website](https://community.torproject.org/relay/setup/bridge/), [Gitlab](https://gitlab.torproject.org/tpo/anti-censorship/docker-obfs4-bridge), [Manual](https://community.torproject.org/relay/setup/bridge/docker/) |
| Tor Privoxy | ✅ | `registry.gitlab.com/skobkin/torproxy-obfs4` | Tor image with integrated privoxy and OBFS4 bridge support. | [Original image Github](https://github.com/dperson/torproxy), [OBFS4 support image Gitlab](https://gitlab.com/skobkin/torproxy-obfs4) |

View file

@@ -0,0 +1,8 @@
# see https://hub.docker.com/r/skobkin/telegram-llm-bot
TELEGRAM_TOKEN=12345
OLLAMA_TOKEN=12345
OLLAMA_BASE_URL=http://host.docker.internal:11434
LOG_MAX_SIZE=5m
LOG_MAX_FILE=5

View file

@@ -0,0 +1,16 @@
# https://hub.docker.com/r/skobkin/telegram-llm-bot
version: '3.9'
services:
  telegram-llm-bot:
    image: "skobkin/telegram-llm-bot:${IMAGE_VERSION:-latest}"
    container_name: telegram-llm-bot
    extra_hosts:
      # Make the Docker host reachable as host.docker.internal (where Ollama listens)
      - "host.docker.internal:host-gateway"
    env_file: .env
    restart: unless-stopped
    logging:
      driver: "json-file"
      options:
        max-size: "${LOG_MAX_SIZE:-5m}"
        max-file: "${LOG_MAX_FILE:-5}"
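
The compose file above reaches an Ollama instance running on the Docker host through `host.docker.internal` (mapped via `extra_hosts`), with `OLLAMA_BASE_URL` in `.env` pointing at it. Purely as an illustration of the alternative design, and not part of this commit, here is a hypothetical sketch (the `ollama` service and `ollama-models` volume names are assumptions) that runs Ollama as a sibling service inside the same stack and addresses it by service name:

# Hypothetical alternative, NOT part of this PR: run Ollama inside the stack
services:
  ollama:
    image: "ollama/ollama"              # official Ollama image, serves its API on port 11434
    container_name: ollama
    volumes:
      - "ollama-models:/root/.ollama"   # persist pulled models between restarts
    restart: unless-stopped

  telegram-llm-bot:
    image: "skobkin/telegram-llm-bot:${IMAGE_VERSION:-latest}"
    env_file: .env
    environment:
      # Override the .env value so the bot talks to the in-stack service
      OLLAMA_BASE_URL: "http://ollama:11434"
    depends_on:
      - ollama
    restart: unless-stopped

volumes:
  ollama-models:

With that layout the `extra_hosts` entry would no longer be needed, at the cost of running a second Ollama instance if one already exists on the host.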