Local LLM
AI that doesn't phone home.
SINCE
2025
PROFICIENCY
80%
GelbIT and FTC Dashboard both use local LLMs via Ollama. For GelbIT, I fine-tuned Gemma 3 on German recycling rules, reaching 94% classification accuracy. Running models locally means no API costs, no rate limits, no data leaving the machine, and sub-second response times on consumer hardware.
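Serving a fine-tuned model through Ollama typically means exporting the weights (e.g. to GGUF) and packaging them with a Modelfile. A minimal sketch; the file name, model name, and system prompt below are illustrative assumptions, not GelbIT's actual configuration:

```text
# Modelfile (hypothetical) — packages a fine-tuned Gemma 3 export for Ollama
FROM ./gemma3-recycling.gguf
PARAMETER temperature 0.1
SYSTEM """You classify household waste items according to German recycling rules."""
```

Registered once with `ollama create gelbit-gemma3 -f Modelfile`, the model is then available to the Python client like any other local model.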
I USE IT FOR
- Privacy-sensitive AI features
- Offline-capable AI applications
- Cost-sensitive deployments
- Fine-tuned models for specific tasks
CODE SAMPLE
import ollama

# Query the locally running Ollama server; a low temperature keeps
# the classification output short and consistent across runs.
response = ollama.generate(
    model="gemma3",
    prompt="Classify this waste item: empty yogurt cup",
    options={"temperature": 0.1},
)
print(response["response"])
# → yellow bin (Gelber Sack)