# chatbot

workflow · v1.0.0 · Simple LLM chatbot
## Install

```shell
kdeps registry install chatbot
```

Then run locally:

```shell
kdeps exec chatbot
```

Configure your LLM provider in `~/.kdeps/config.yaml` (created automatically on first run).
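The exact schema of `~/.kdeps/config.yaml` depends on your kdeps version; the sketch below is illustrative only (the key names `llm`, `provider`, `host`, and `model` are assumptions, not taken from the kdeps docs). The host shown is Ollama's default local endpoint.

```yaml
# Illustrative only: check the kdeps documentation for the actual schema.
llm:
  provider: ollama                 # hypothetical key naming the backend
  host: http://localhost:11434     # Ollama's default local API endpoint
  model: llama3                    # any model you have pulled locally
```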
## README

### Chatbot Example

A simple LLM chatbot using the unified API.
### Features

- ✅ YAML configuration
- ✅ Unified API (`get()` function)
- ✅ LLM chat with Ollama
- ✅ JSON response
- ✅ Validation with preflight checks
### Run Locally

```shell
# From the examples/chatbot directory
kdeps run workflow.yaml

# Or from the repository root
kdeps run examples/chatbot/workflow.yaml
```
### Test

```shell
curl -X POST http://localhost:16395/api/v1/chat \
  -H "Content-Type: application/json" \
  -d '{"q": "What is artificial intelligence?"}'
```
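The same request can be sent from Python's standard library. This sketch assumes the workflow is already running locally on port 16395 (the port used in the curl example above); the `ask` helper is just a convenience name, not part of kdeps.

```python
import json
import urllib.request

# Endpoint exposed by the running chatbot workflow; adjust the port
# if your workflow.yaml configures a different one.
URL = "http://localhost:16395/api/v1/chat"

def ask(question: str) -> dict:
    """POST a question to the chatbot and return the decoded JSON reply."""
    payload = json.dumps({"q": question}).encode("utf-8")
    req = urllib.request.Request(
        URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires the workflow to be running):
#   reply = ask("What is artificial intelligence?")
#   print(reply["data"]["answer"])
```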
### Response

```json
{
  "data": {
    "answer": "Artificial intelligence (AI) is..."
  },
  "query": "What is artificial intelligence?"
}
```
### Structure

```
chatbot/
├── workflow.yaml        # Main workflow configuration
└── resources/
    ├── llm.yaml         # LLM chat resource
    └── response.yaml    # API response resource
```
### Key Concepts

#### Unified API

Use `get()` for all data access:

```yaml
# Get a query parameter
prompt: "{{ get('q') }}"

# Get the LLM response from a previous resource
data: get('llmResource')

# Validation
validations:
  - get('q') != ''
```
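Putting these pieces together, a resource like `resources/llm.yaml` might combine a `get()` call in the prompt with a preflight validation. The field names below are a sketch modeled on the snippets above, not the authoritative kdeps resource schema; the `id` and `model` values are assumptions.

```yaml
# Illustrative sketch of resources/llm.yaml; field names follow the
# snippets above and may not match the exact kdeps resource schema.
id: llmResource
chat:
  model: llama3                    # assumed model name
  prompt: "{{ get('q') }}"         # query parameter from the request
validations:
  - get('q') != ''                 # reject empty questions before calling the LLM
```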
#### Auto-Detection

`get()` automatically detects the data source:

- `get('q')` → query parameter
- `get('llmResource')` → resource output
- `get('user_data')` → memory storage
## Versions

| Version | Published | Status |
|---|---|---|
| 1.0.0 | 4/11/2026 | active |
## Details

- Author: kdeps
- License: Apache-2.0
- Latest version: 1.0.0
- Published: 4/11/2026
## Tags

`llm` · `chatbot`