# Telegram Bot Example

A Telegram bot that receives user messages and replies using a local LLM (Ollama `llama3.2:3b`).

## Features

- Polls Telegram for new messages every second
- Sends each message to a local LLM with a custom persona ("Kodi")
- Replies to the user automatically on Telegram
## Prerequisites

- A Telegram bot token (create one with @BotFather via `/newbot`)
- Docker or a local kdeps runtime
- An LLM provider configured in `~/.kdeps/config.yaml` (created automatically on first run)
## Setup

### 1. Get a Bot Token

Talk to @BotFather on Telegram:

```
/newbot
→ Name: My Kodi Bot
→ Username: my_kodi_bot
→ Token: 1234567890:AAH...
```
### 2. Run the Bot

```bash
export TELEGRAM_BOT_TOKEN="1234567890:AAH..."

# From the examples/telegram-bot directory
kdeps run workflow.yaml

# Or from the project root
kdeps run examples/telegram-bot/workflow.yaml
```

You should see:

```
Bot input sources active:
  • Telegram (polling)
Starting bot runners... (press Ctrl+C to stop)
```
### 3. Chat with the Bot

Open Telegram, find your bot by its username, and send it a message. It will reply using the LLM.
## Structure

```
telegram-bot/
├── workflow.yaml             # Bot source config, Telegram credentials, Ollama model
├── components/
│   └── botreply/
│       └── component.yaml    # .komponent: sends reply through the bot session
└── resources/
    └── llm.yaml              # LLM chat resource — receives input('message'), applies persona
```
The `botreply` component encapsulates the `run.botReply` executor and is auto-loaded from the `components/` directory when the workflow is parsed. This makes the component independently distributable: install it once with `kdeps component install botreply` and reuse it across multiple Telegram bot workflows.

To package the component as a shareable `.komponent` archive:

```bash
kdeps package components/botreply --output components/
```
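For orientation, the component manifest might look roughly like this. This is a hypothetical sketch, not the shipped file: the `run.botReply` executor name comes from this README, but the surrounding keys and field names are assumptions.

```yaml
# components/botreply/component.yaml -- hypothetical sketch, not the shipped manifest
name: botreply
run:
  botReply:
    chatId: "{{ input('chatId') }}"   # route the reply to the originating chat
    message: "{{ get('llm') }}"       # LLM-generated reply text
```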
## Key Expressions

| Expression | Description |
|---|---|
| `input('message')` | The text the Telegram user sent |
| `input('chatId')` | Telegram chat ID (for targeted replies) |
| `input('userId')` | Telegram user ID |
| `input('platform')` | Always `"telegram"` for this bot |
| `get('llm')` | The LLM-generated reply text |
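The expressions above are typically wired together in the LLM resource. The following is an illustrative sketch only, not the shipped `resources/llm.yaml`: the `chat` and `scenario` keys mirror the snippets later in this README, but the exact field where `input('message')` is injected is an assumption.

```yaml
# Illustrative sketch of resources/llm.yaml -- not the shipped file.
# Placing input('message') under chat.prompt is an assumption.
chat:
  model: llama3.2:3b
  prompt: "{{ input('message') }}"
scenario:
  - role: assistant
    prompt: |
      You are Kodi, a friendly assistant. Keep replies short.
```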
## Customization

### Use a Different Model

```yaml
# workflow.yaml
agentSettings:
  models:
    - llama3.1:8b   # Smarter, slower
```

```yaml
# resources/llm.yaml
chat:
  model: llama3.1:8b
```
### Change the Persona

Edit the `scenario` block in `resources/llm.yaml`:

```yaml
scenario:
  - role: assistant
    prompt: |
      You are Aria, a customer support agent for Acme Corp.
      Help users with order status, returns, and product questions.
```
### Add Discord Support

Add a `discord` block to `workflow.yaml`:

```yaml
bot:
  executionType: polling
  discord:
    botToken: "{{ env('DISCORD_BOT_TOKEN') }}"
  telegram:
    botToken: "{{ env('TELEGRAM_BOT_TOKEN') }}"
```
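Both tokens are resolved with `env()` when the workflow loads, so export them before starting the bot. The values below are placeholders:

```shell
# Placeholder tokens; substitute the real values from BotFather
# and the Discord developer portal before running the workflow.
export TELEGRAM_BOT_TOKEN="1234567890:AAH..."
export DISCORD_BOT_TOKEN="your-discord-bot-token"
```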
## See Also

- Stateless Bot Example — one-shot stdin/stdout execution
- Input Sources Documentation
- Bot Tutorial
- Voice Assistant Example — audio/microphone input with TTS output
## Versions

| Version | Published | Status |
|---|---|---|
| 1.0.0 | 4/11/2026 | active |
## Details

- Author: kdeps
- License: Apache-2.0
- Latest Version: 1.0.0
- Published: 4/11/2026