Simple bot to interact with open-source LLMs running in Ollama via Telegram

# telegram-ollama-reply-bot


## Usage

### Docker

```bash
docker run \
  -e OLLAMA_TOKEN=123 \
  -e OLLAMA_BASE_URL=http://ollama.localhost:11434/v1 \
  -e TELEGRAM_TOKEN=12345 \
  skobkin/telegram-llm-bot
```
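
If Ollama itself also runs as a container on the same host, `OLLAMA_BASE_URL` has to be reachable from inside the bot's container. Below is a minimal sketch using a user-defined Docker network; the network name `llm-net`, the container name `ollama`, and the `llama3` model tag are illustrative and not prescribed by this repository (the `/v1` suffix on the base URL follows the example above).

```bash
# Shared network so the bot can resolve the Ollama container by name (names are examples)
docker network create llm-net

# Run the official Ollama image on that network; it listens on port 11434 inside the container
docker run -d --name ollama --network llm-net -v ollama:/root/.ollama ollama/ollama

# Pull a model into the Ollama container (model tag is an example)
docker exec -it ollama ollama pull llama3

# Run the bot on the same network, pointing OLLAMA_BASE_URL at the Ollama container
docker run -d --network llm-net \
  -e OLLAMA_TOKEN=123 \
  -e OLLAMA_BASE_URL=http://ollama:11434/v1 \
  -e TELEGRAM_TOKEN=12345 \
  skobkin/telegram-llm-bot
```

With a non-containerized Ollama install, the base URL can instead point at the host (for example via `host.docker.internal` where the Docker setup supports it, or the host's LAN address).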