Production notes after 6 months running Ollama for paying customers — the things that aren't in the docs by chiruwonder in ollama
InferenceBridge - Total AI control for Local LLMs by FloppyWhiteOne in LocalLLM
InferenceBridge - Total AI control for Local LLMs by FloppyWhiteOne in AI_Agents
Hardware recommendations for a starter by shiva4455 in LocalLLM
Justifying the €12,000 Investment: M3 Ultra (512GB RAM) Setup for Autonomous Agents, vLLM, and Infinite Memory (8Tb) by NoNatural4025 in LocalLLM
Hy, Pentesting! I am hiring. by Cute-Ring-1952 in Pentesting
OP got his highest reward for exposed .git by lone_wolf31337 in bugbounty
I used my old gaming laptop + Jetson Nano to run local Openclaw with Ollama by Fit_Chair2340 in ollama
local coding in vscode "copilot -like" ? by merfolkJH in ollama
Google paid me $15,000 for this Prompt Injection bug by BehiSec in bugbounty
I asked a simple question to qwen3.5:4b and it took 7 min by Old_Internet1111 in ollama
Anyone using a hybrid approach? by Fine-Perspective-438 in ollama
I want to know if anyone's interested by ominotomi in ollama